Universal Algorithms for Learning Theory Part I : Piecewise Constant Functions
Authors
Abstract
This paper is concerned with the construction and analysis of a universal estimator for the regression problem in supervised learning. Universal means that the estimator does not depend on any a priori assumptions about the regression function to be estimated. The universal estimator studied in this paper consists of a least-squares fitting procedure using piecewise constant functions on a partition which depends adaptively on the data. The partition is generated by a splitting procedure which differs from those used in CART algorithms. It is proven that this estimator performs at the optimal convergence rate for a wide class of priors on the regression function. Namely, as will be made precise in the text, if the regression function is in any one of a certain class of approximation spaces (or smoothness spaces of order not exceeding one, a limitation resulting from the estimator's use of piecewise constants) measured relative to the marginal measure, then the estimator converges to the regression function (in the least-squares sense) with an optimal rate of convergence in terms of the number of samples. The estimator is also numerically feasible and can be implemented on-line.
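The core idea — a least-squares fit by piecewise constants on a partition refined adaptively from the data — can be illustrated with a rough sketch. The code below uses a simplified greedy dyadic splitting rule on [0, 1) (a cell is halved while doing so reduces the empirical squared error by more than a tolerance); this is only a hedged illustration, not the paper's actual splitting and thresholding procedure, and all function names are hypothetical.

```python
import numpy as np

def fit_piecewise_constant(x, y, a=0.0, b=1.0, min_samples=5, tol=1e-3):
    """Greedily build an adaptive dyadic partition of [a, b) and return a
    list of cells (lo, hi, value), where value is the least-squares
    constant (the sample mean) on that cell.  A simplified sketch, not
    the paper's exact splitting rule."""
    mask = (x >= a) & (x < b)
    if mask.sum() < min_samples:
        val = float(y[mask].mean()) if mask.any() else 0.0
        return [(a, b, val)]
    cell_mean = y[mask].mean()
    err = float(((y[mask] - cell_mean) ** 2).sum())
    mid = 0.5 * (a + b)
    # Squared error after splitting the cell at its midpoint.
    split_err = 0.0
    for m in ((x >= a) & (x < mid), (x >= mid) & (x < b)):
        if m.any():
            split_err += float(((y[m] - y[m].mean()) ** 2).sum())
    # Split only if the error reduction exceeds the tolerance.
    if err - split_err > tol * x.size:
        return (fit_piecewise_constant(x, y, a, mid, min_samples, tol)
                + fit_piecewise_constant(x, y, mid, b, min_samples, tol))
    return [(a, b, float(cell_mean))]

def evaluate(cells, t):
    """Evaluate the piecewise constant estimator at a point t."""
    for lo, hi, c in cells:
        if lo <= t < hi:
            return c
    return cells[-1][2]
```

On data sampled from a step function plus noise, the greedy rule isolates the jump and returns (near-)constant values on either side; the adaptivity is what lets such estimators reach the optimal rate for functions of low smoothness.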
Similar Resources
Universal algorithms for learning theory Part II : piecewise polynomial functions
This paper is concerned with estimating the regression function fρ in supervised learning by utilizing piecewise polynomial approximations on adaptively generated partitions. The main point of interest is algorithms that with high probability are optimal in terms of the least square error achieved for a given number m of observed data. In a previous paper [1], we have developed for each β > 0 a...
Hybrid Functions Approach and Piecewise Constant Function by Collocation Method for the Nonlinear Volterra-Fredholm Integral Equations
In this work, we will compare two approximation methods based on hybrid Legendre and Block-Pulse functions, and a computational method for solving nonlinear Fredholm-Volterra integral equations of the second kind which is based on replacement of the unknown function by a truncated series of well-known Block-Pulse function (BPF) expansions...
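A truncated Block-Pulse expansion replaces a function on [0, 1) by a step function that is constant on each of m equal blocks. The minimal sketch below computes the coefficients by the midpoint rule (the cited collocation method is more involved); the helper names are hypothetical.

```python
import numpy as np

def block_pulse_coeffs(f, m):
    """Coefficients of the truncated Block-Pulse expansion of f on [0, 1):
    c_i approximates m * integral of f over [i/m, (i+1)/m), computed here
    with the midpoint rule, i.e. f at the block centre."""
    centres = (np.arange(m) + 0.5) / m
    return f(centres)

def block_pulse_eval(c, t):
    """Evaluate sum_i c_i * chi_[i/m, (i+1)/m)(t) at points t in [0, 1)."""
    m = len(c)
    i = np.minimum((np.asarray(t) * m).astype(int), m - 1)
    return c[i]
```

Since each block has width 1/m, the sup-norm error of this expansion for a Lipschitz function decays like O(1/m), which is why such bases pair naturally with collocation schemes.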
Online Optimization of Smoothed Piecewise Constant Functions
We study online optimization of smoothed piecewise constant functions over the domain [0, 1). This is motivated by the problem of adaptively picking parameters of learning algorithms as in the recently introduced framework by Gupta and Roughgarden (2016). The majority of the machine learning literature has focused on Lipschitz-continuous functions, or functions with bounded gradients. This is with ...
A Convex Parametrization of a New Class of Universal Kernel Functions for use in Kernel Learning
We propose a new class of universal kernel functions which admit a linear parametrization using positive semidefinite matrices. These kernels are generalizations of the Sobolev kernel and are defined by piecewise-polynomial functions. The class of kernels is termed “tessellated” as the resulting discriminant is defined piecewise with hyper-rectangular domains whose corners are determined by the...
Generalized methods and solvers for noise removal from piecewise constant signals. II. New methods.
Removing noise from signals which are piecewise constant (PWC) is a challenging signal processing problem that arises in many practical scientific and engineering contexts. In the first paper (part I) of this series of two, we presented background theory building on results from the image processing community to show that the majority of these algorithms, and more proposed in the wider literatu...
Journal: Journal of Machine Learning Research
Volume: 6
Pages: -
Published: 2005